Qualcomm AI Research
- Telecommunications (0.41)
- Semiconductors & Electronics (0.41)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks (1.00)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Uncertainty > Bayesian Inference (0.46)
- Information Technology > Artificial Intelligence > Machine Learning > Learning Graphical Models > Directed Networks > Bayesian Learning (0.46)
Modality-Agnostic Topology Aware Localization - Supplemental Material - Farhad G. Zanjani, Ilia Karmanov, Hanno Ackermann, Daniel Dijkman, Simone Merlin, Max Welling, Fatih Porikli, Qualcomm AI Research
Triplet sampling was implemented based on the temporal vicinity of samples; the widths of the temporal windows roughly depend on the speed of the observer in the environment. Using Euclidean distance, the K nearest neighbors (K=3) of each prototype vector are found. All 15 iGibson environments are used in the experiment. Qualcomm AI Research is an initiative of Qualcomm Technologies, Inc.
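The two sampling steps above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the resampling strategy for negatives, and the per-tensor distance computation are all assumptions.

```python
import numpy as np

def sample_triplet(num_samples, window, rng=None):
    """Sample (anchor, positive, negative) indices from a temporally ordered
    sequence: the positive lies within the anchor's temporal window,
    the negative outside it (hypothetical sketch of temporal-vicinity
    triplet sampling)."""
    rng = rng or np.random.default_rng()
    a = int(rng.integers(0, num_samples))
    lo, hi = max(0, a - window), min(num_samples, a + window + 1)
    p = int(rng.integers(lo, hi))          # positive: temporally close
    n = int(rng.integers(0, num_samples))  # negative: resample until far
    while abs(n - a) <= window:
        n = int(rng.integers(0, num_samples))
    return a, p, n

def knn_prototypes(prototypes, k=3):
    """Euclidean K nearest neighbors (excluding self) of each prototype vector."""
    d = np.linalg.norm(prototypes[:, None] - prototypes[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)  # a prototype is not its own neighbor
    return np.argsort(d, axis=1)[:, :k]
```

In practice the window width would be set from the observer's speed, as the text notes; here it is just an integer parameter.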
- Telecommunications (0.82)
- Semiconductors & Electronics (0.82)
Optimized learned entropy coding parameters for practical neural-based image and video compression
Said, Amir, Pourreza, Reza, Le, Hoang
Neural-based image and video codecs are significantly more power-efficient when weights and activations are quantized to low-precision integers. While there are general-purpose techniques for reducing quantization effects, large losses can occur when specific entropy coding properties are not considered. This work analyzes how entropy coding is affected by parameter quantization, and provides a method to minimize losses. It is shown that, by learning a certain type of coding parameter representation, uniform quantization becomes practically optimal, also simplifying the minimization of code memory requirements. The mathematical properties of the new representation are presented, and its effectiveness is demonstrated by coding experiments, showing that good results can be obtained with precision as low as 4 bits per network output, and practically no loss with 8 bits.
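The uniform quantization the abstract refers to can be illustrated with a minimal sketch. The symmetric, per-tensor scaling scheme below is an assumption for illustration, not the paper's actual parameterization:

```python
import numpy as np

def uniform_quantize(params, bits=4):
    """Uniformly quantize a parameter tensor to a signed low-precision grid.

    Hypothetical sketch: symmetric per-tensor scaling with round-to-nearest.
    Returns the dequantized values, the integer codes, and the step size.
    """
    levels = 2 ** (bits - 1) - 1              # e.g. +/-7 for 4-bit signed
    scale = np.max(np.abs(params)) / levels   # per-tensor step size
    q = np.clip(np.round(params / scale), -levels, levels)
    return q * scale, q.astype(np.int8), scale
```

With round-to-nearest, the reconstruction error of any in-range value is bounded by half a step, which is the sense in which low-bit uniform grids can stay close to the full-precision parameters.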
- North America > United States > California > San Diego County > San Diego (0.05)
- North America > United States > Louisiana > Orleans Parish > New Orleans (0.04)
- North America > United States > California > Santa Clara County > Palo Alto (0.04)
- Information Technology > Hardware (0.34)
- Information Technology > Artificial Intelligence (0.31)
Qualcomm Touts Eight AI "Firsts"
Whenever people take photos or speak to a digital assistant using a mobile phone, they often don't realize that they just took advantage of Artificial Intelligence (AI). If they think of AI at all, it is typically in the context of Autonomous Vehicles or perhaps Facebook's (Meta's) massive data centers. While AI is becoming ubiquitous and distributed across edge devices and cloud servers, many challenges remain in realizing CEO Cristiano Amon's vision of a connected intelligent edge in which AI enables automated perception, reasoning, and action. For AI to deliver those levels of automation and personalization, Qualcomm AI Research VP Jilei Hou believes that AI hardware and software must become much smaller, faster, more efficient, and lower power, and must be able to learn continuously at the edge in the real world. This provides the perfect complement to remote processing in the cloud, whose reach has been further advanced through Qualcomm's 5G technology.
- Telecommunications (0.98)
- Semiconductors & Electronics (0.98)
- Information Technology > Services (0.71)
- Information Technology > Artificial Intelligence (1.00)
- Information Technology > Communications > Mobile (0.94)
- Information Technology > Communications > Social Media (0.77)
Optimizing power efficiency to bring AI to the end device
Artificial intelligence (AI) is core to making our devices smarter. With billions of connected devices, the only way to manage the tremendous amount of data created by all of these interconnections is for devices to be able to make decisions at some level independently of the cloud. As devices become more complex and perform an increasing number of real-time functions, they will also need to be able to learn (known as training in AI) and adapt to specific applications, users, and environmental conditions. To meet real-time requirements and lower operating costs, these functions will need to move on-device as well. The challenge for developers is to implement intelligence in these edge devices in a way that is efficient in terms of performance and power.